#
#
#---- example1.lf
#
#
#---- This is an lf example that learns the XOR, NXOR,
#---- and AND functions of two binary inputs.
#
#---- The trees are saved into the file example1.tre
#---- The encodings are saved into the file example1.cod
#---- Note that a saved set of trees must be accompanied
#---- by its corresponding encodings if the trees are to function
#---- properly in future trials where the trees are loaded
#---- instead of generated.
#

#---- Specify tree statements.
tree

#---- Train on trees of 512 leaves.
size = 512

#---- Train until we get 4 elements of the training set right
min correct = 4

#---- or until 10 epochs have passed.
max epochs = 10

#---- Output folded trees for later retrieval and evaluation.
save folded tree to "example1.tre"

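The two statements above describe a stopping rule: training ends as soon as enough training examples are correct, or after a fixed number of epochs. The lf source is not shown here, but a minimal sketch of this kind of rule in Python (all names hypothetical, not lf's actual code) might look like:

```python
# Hypothetical sketch of the stopping rule implied by "min correct"
# and "max epochs": run one epoch at a time, stop early once at least
# min_correct training examples are classified correctly, otherwise
# give up after max_epochs epochs.
def train(run_epoch, min_correct=4, max_epochs=10):
    correct = 0
    for epoch in range(1, max_epochs + 1):
        correct = run_epoch()  # one epoch; returns how many examples were correct
        if correct >= min_correct:
            break
    return epoch, correct

# Simulated epochs: correctness improves until all 4 examples pass.
scores = iter([1, 2, 4])
print(train(lambda: next(scores)))  # (3, 4)
```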
#---- Specify function statements.
function

#---- Domain dimension MUST be the first statement, followed
#---- by the codomain dimension statement.
domain dimension = 2

#---- We are training on 3 functions at once: XOR, NXOR, and AND,
#---- which means there are 3 codimensions.
codomain dimension = 3

#---- Coding output will be saved for use with the trees we are saving.
save coding to "example1.cod"

#---- All dimensions and codimensions are boolean, so specify
#---- bits:stepsize for the encoding of each input and output.
coding = 1:1 1:1 1:1 1:1 1:1

#---- Boolean values have 2 quantization levels.
quantization = 2 2 2 2 2

#---- Optional specification of the largest values in the 5 encodings;
#---- if not specified, the largest value in the training and test sets
#---- is used.
largest = 1 1 1 1 1
-
- #---- Optional specifications of the smallest values in the 5 encodings;
- #---- if not specified, then the smallest value in the training and test set
- #---- is used.
- #---- Note that the smallest values may not equal the largest values.
- smallest = 0 0 0 0 0
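Taken together, the quantization, largest, and smallest statements define a linear mapping from a value's range onto a fixed number of integer levels. The exact formula lf uses is not shown here; a minimal sketch of such an encoder in Python (hypothetical, for illustration only) might be:

```python
def quantize(value, smallest, largest, levels):
    """Map a value in [smallest, largest] to an integer level in [0, levels - 1].

    Hypothetical linear quantization; lf's actual encoder may differ
    (e.g. in rounding or in how bits:stepsize is applied).
    """
    if largest == smallest:
        raise ValueError("smallest must not equal largest")
    fraction = (value - smallest) / (largest - smallest)
    return round(fraction * (levels - 1))

# Boolean encodings from this example: 2 levels, smallest 0, largest 1.
print(quantize(0.0, 0, 1, 2))  # 0
print(quantize(1.0, 0, 1, 2))  # 1
```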
-
- #---- There are four rows in our training set.
- training set size = 4
- training set =
-
- # A B A xor B A nxor B A and B
- 1 1 0 1 1
- 1 0 1 0 0
- 0 1 1 0 0
- 0 0 0 1 0
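The four training rows above are exactly the truth tables of the three Boolean functions. A short check in plain Python (not part of lf) that regenerates the table:

```python
# Regenerate the training table from the Boolean definitions of
# XOR, NXOR, and AND over two binary inputs, in the same row order.
rows = []
for a in (1, 0):
    for b in (1, 0):
        x = a ^ b    # A xor B
        nx = 1 - x   # A nxor B
        nd = a & b   # A and B
        rows.append((a, b, x, nx, nd))

for row in rows:
    print(*row)
```

This prints the same four rows that appear in the training set.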

#---- We will test on the following 4 vectors.
test set size = 4
test set =
# A B A xor B A nxor B A and B
1 1 0 1 1
1 0 1 0 0
0 1 1 0 0
0 0 0 1 0

#---- The following output file should be generated:
#---- The first line indicates how many codomains there are.
#---- The next four lines represent each of the four lines in the test set.
#---- Each value is followed by its corresponding quantization number
#---- in the prescribed encoding scheme. Each codomain is followed
#---- by the corresponding result from the ALNs, along with its quantization
#---- number. Remember, it is the calculated quantization level, not the
#---- calculated value, that matters most. You can get more accurate values
#---- by tightening up the encoding, i.e. increasing the number of
#---- quantization levels.

#---- After the results comes the error histogram, which counts,
#---- for each of the codomains, the number of times the result quantization
#---- level differed from the actual quantization level by n. In this example,
#---- the ALNs executed the test set perfectly, so there are 4 counts for
#---- errors of n = 0 in each of the 3 codomains.
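The histogram described above can be computed from pairs of (actual, result) quantization levels. A hypothetical sketch in Python, not lf's own code:

```python
def error_histogram(actual, result, bins=10):
    """Count, per codomain, how often |result level - actual level| == n.

    `actual` and `result` are lists of rows; each row is a tuple of
    quantization levels, one per codomain. Differences of bins-1 or
    more are lumped into the last bucket, mirroring the "9+" row.
    """
    ncodomains = len(actual[0])
    hist = [[0] * ncodomains for _ in range(bins)]
    for a_row, r_row in zip(actual, result):
        for c, (a, r) in enumerate(zip(a_row, r_row)):
            n = min(abs(r - a), bins - 1)
            hist[n][c] += 1
    return hist

# Perfect test run from this example: result levels equal actual levels
# for all 3 codomains (XOR, NXOR, AND) over the 4 test rows.
levels = [(0, 1, 1), (1, 0, 0), (1, 0, 0), (0, 1, 0)]
hist = error_histogram(levels, levels)
print(hist[0])  # [4, 4, 4]
```

With identical actual and result levels, every count lands in the n = 0 row, matching the "0 errors 4 4 4" line of the expected output.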

# A B A xor B A xor B result A nxor B A nxor B result A and B A and B result

#3
#1.000000 1 1.000000 1 0.000000 0 0.000000 0 1.000000 1 1.000000 1 1.000000 1 1.000000 1
#1.000000 1 0.000000 0 1.000000 1 1.000000 1 0.000000 0 0.000000 0 0.000000 0 0.000000 0
#0.000000 0 1.000000 1 1.000000 1 1.000000 1 0.000000 0 0.000000 0 0.000000 0 0.000000 0
#0.000000 0 0.000000 0 0.000000 0 0.000000 0 1.000000 1 1.000000 1 0.000000 0 0.000000 0
#
#ERROR HISTOGRAM
#0 errors 4 4 4
#1 errors 0 0 0
#2 errors 0 0 0
#3 errors 0 0 0
#4 errors 0 0 0
#5 errors 0 0 0
#6 errors 0 0 0
#7 errors 0 0 0
#8 errors 0 0 0
#9+ errors 0 0 0